AAAI AI-Alert for Apr 21, 2020


How utilities are using AI to adapt to electricity demands

#artificialintelligence

The spread of the novel coronavirus that causes COVID-19 has prompted state and local governments around the U.S. to institute shelter-in-place orders and business closures. As millions suddenly find themselves confined to their homes, the shift has strained not only internet service providers, streaming platforms, and online retailers, but the utilities supplying power to the nation's electrical grid, as well. U.S. electricity use on March 27, 2020 was 3% lower than it was on March 27, 2019, a loss of about three years of sales growth. Peter Fox-Penner, director of the Boston University Institute for Sustainable Energy, asserted in a recent op-ed that utility revenues will suffer because providers are halting shutoffs and deferring rate increases. Moreover, according to research firm Wood Mackenzie, the rise in household electricity demand won't offset reduced business electricity demand, mainly because residential demand makes up just 40% of the total demand across North America.


NHS trials AI system to predict coronavirus ventilator demand Verdict

#artificialintelligence

The NHS is turning to artificial intelligence (AI) to help predict upcoming demand across England for intensive care beds and ventilators during the coronavirus pandemic. Trials of the predictive system, known as the COVID-19 Capacity Planning and Analysis System (CPAS), began today at four hospitals. It harnesses the principles of machine learning – algorithms that find and apply patterns in data – to provide statistics, forecasts and simulation environments that help the NHS plan resources during the pandemic. For example, the system's predictions could warn a hospital in advance that it will reach capacity, giving it time to bring in extra resources or share capacity with neighbouring hospitals. If CPAS proves accurate, the NHS will look to roll it out across the rest of the country.
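
CPAS's internal models are not public, but the kind of planning signal the excerpt describes can be sketched with a toy forecaster. Everything below – the function name, the linear-trend model, the numbers – is illustrative, not the NHS system:

```python
# Hypothetical sketch: fit a linear trend to recent ICU occupancy and
# project how many days remain before a hospital reaches capacity.

def days_until_capacity(occupancy_history, capacity):
    """Return projected days until occupancy reaches `capacity`,
    or None if the fitted trend is flat or falling."""
    n = len(occupancy_history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(occupancy_history) / n
    # Ordinary least-squares slope and intercept.
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(xs, occupancy_history))
    slope /= sum((x - mean_x) ** 2 for x in xs)
    if slope <= 0:
        return None  # occupancy not rising; no projected breach
    intercept = mean_y - slope * mean_x
    days = (capacity - intercept) / slope - (n - 1)
    return max(0.0, days)

# A ward of 100 beds with occupancy rising about 5 beds per day:
print(days_until_capacity([60, 65, 70, 75, 80], capacity=100))  # 4.0
```

A real system would model uncertainty and many more covariates; the point is only that a forecast of "days until capacity" is what buys hospitals the lead time described above.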


New DoE Program Drives Demand For Machine Learning Programmers

#artificialintelligence

Machine learning is leading to numerous changes in the energy industry. The Department of Energy recently announced that it is taking steps to accelerate the integration of machine learning technology into energy research and development, with its head announcing a $30 million investment in artificial intelligence and machine learning. The new programs will serve multiple purposes; one of the biggest goals is to use machine learning to facilitate the development of new renewable energy technologies.


Why Having a Chief AI Officer Should Matter to HR

#artificialintelligence

Companies using artificial intelligence (AI) across their business units should consider creating a C-suite position to oversee how AI is used and guard against the risk of making bad decisions based on biased algorithms, experts say. Only a few companies, like Levi Strauss & Co, have established a chief artificial intelligence officer (CAIO) position, and fewer have created a C-level position dedicated solely to AI ethics. Brian Kropp, chief of research in the HR practice at Gartner, said chief technology officers and chief information officers will struggle with handling AI-related decisions and ethical dilemmas. "CTOs and CIOs are going to be thinking about the role through the lens of how they can make the technology work," Kropp said. However, "artificial intelligence is not a question of how you get the technology to work; it's a question of how do you think through the implications of the technology?"


Widely Used AI Machine Learning Methods Don't Work as Claimed

#artificialintelligence

Researchers demonstrated the mathematical impossibility of representing social networks and other complex networks using popular methods of 'low-dimensional embeddings.' Models and algorithms for analyzing complex networks are widely used in research and affect society at large through their applications in online social networks, search engines, and recommender systems. According to a new study, however, one widely used algorithmic approach for modeling these networks is fundamentally flawed, failing to capture important properties of real-world complex networks. "It's not that these techniques are giving you absolute garbage. They probably have some information in them, but not as much information as many people believe," said C. "Sesh" Seshadhri, associate professor of computer science and engineering in the Baskin School of Engineering at UC Santa Cruz. Seshadhri is first author of a paper on the new findings published on March 2, 2020, in Proceedings of the National Academy of Sciences.
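
The flavor of the paper's claim can be seen in a toy experiment (an assumed setup, not the authors' code): embed a triangle-rich graph into a low-dimensional space via a truncated SVD and most of its triangles vanish. For a simple graph with adjacency matrix A, the triangle count is trace(A³)/6:

```python
# Illustrative: a graph of 30 disjoint triangles, "embedded" by keeping
# only its top 8 singular directions.
import numpy as np

n = 90                      # 30 triangles x 3 nodes
A = np.zeros((n, n))
for i in range(0, n, 3):    # wire up each triangle
    for a in range(3):
        for b in range(3):
            if a != b:
                A[i + a, i + b] = 1.0

def triangles(M):
    return np.trace(M @ M @ M) / 6

# Rank-8 reconstruction from the top singular vectors -- a stand-in for
# an 8-dimensional embedding of the graph.
U, s, Vt = np.linalg.svd(A)
k = 8
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

print(triangles(A))    # 30.0 triangles in the real graph
print(triangles(A_k))  # far fewer survive the low-rank map
```

Triangle density is exactly the kind of local clustering structure the study argues low-dimensional embeddings cannot preserve.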


Language may help AI navigate new environments

#artificialintelligence

In a new study published this week on the preprint server Arxiv.org, researchers report experiments with BabyAI, a platform for studying language-guided agents; both it and several baseline models will soon be available on GitHub. One of the most powerful techniques in machine learning -- reinforcement learning, which entails spurring software agents toward goals via rewards -- is also one of the most flawed. It is sample inefficient, meaning it requires a large number of training interactions (and compute cycles) to learn a task, and without additional data to cover variations, it adapts poorly to environments that differ from the training environment. It has been theorized that prior knowledge of tasks, conveyed through structured language, could be combined with reinforcement learning to mitigate these shortcomings, and BabyAI was designed to put this theory to the test.
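
A minimal sketch of the idea under test (not BabyAI itself, which uses rich gridworlds and a synthetic language): tabular Q-learning on a five-cell corridor whose state includes a one-word instruction naming the rewarded end. The environment, names, and hyperparameters are all illustrative:

```python
# Language-conditioned Q-learning toy: the instruction is part of the state,
# so one policy table learns to follow either command.
import random

random.seed(0)
SIZE = 5            # corridor cells 0..4
ACTIONS = (-1, 1)   # step left / step right
Q = {}              # (instruction, cell, action) -> estimated value

def step(cell, action, instruction):
    cell = max(0, min(SIZE - 1, cell + action))
    done = cell == (0 if instruction == "left" else SIZE - 1)
    return cell, (1.0 if done else 0.0), done

def greedy(instruction, cell):
    return max(ACTIONS, key=lambda a: Q.get((instruction, cell, a), 0.0))

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.2):
    for _ in range(episodes):
        instr = random.choice(["left", "right"])
        cell = SIZE // 2
        for _ in range(20):
            a = random.choice(ACTIONS) if random.random() < eps else greedy(instr, cell)
            nxt, r, done = step(cell, a, instr)
            target = r + gamma * max(Q.get((instr, nxt, x), 0.0) for x in ACTIONS)
            key = (instr, cell, a)
            Q[key] = Q.get(key, 0.0) + alpha * (target - Q.get(key, 0.0))
            cell = nxt
            if done:
                break

train()
# The same policy table now obeys either instruction from the same cell:
print(greedy("left", SIZE // 2), greedy("right", SIZE // 2))  # -1 1
```

BabyAI's question is whether richer language priors let agents like this learn with far fewer samples than reward signals alone allow.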


Machine learning algorithm quantifies the impact of quarantine measures on COVID-19's spread

#artificialintelligence

Every day for the past few weeks, charts and graphs plotting the projected apex of COVID-19 infections have been splashed across newspapers and cable news. Many of these models have been built using data from studies on previous outbreaks like SARS or MERS. Now, a team of engineers at MIT has developed a model that uses data from the COVID-19 pandemic in conjunction with a neural network to determine the efficacy of quarantine measures and better predict the spread of the virus. "Our model is the first which uses data from the coronavirus itself and integrates two fields: machine learning and standard epidemiology," explains Raj Dandekar, a Ph.D. candidate studying civil and environmental engineering. Together with George Barbastathis, professor of mechanical engineering, Dandekar has spent the past few months developing the model as part of the final project in class 2.168 (Learning Machines).
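
In the MIT model, the quarantine effect is learned by a neural network from case data; that machinery aside, the underlying idea can be sketched with a plain SIR compartment model plus a hand-set quarantine rate. All parameters below are illustrative assumptions, not the team's fitted values:

```python
# SIR model with an added quarantine term q(t) that removes infected
# individuals from circulation (Euler steps of one day).

def simulate(beta=0.3, gamma=0.1, q_strength=0.0, q_start=10, days=100):
    S, I, R = 0.99, 0.01, 0.0   # susceptible / infected / recovered fractions
    T = 0.0                     # quarantined fraction
    history = []
    for day in range(days):
        q = q_strength if day >= q_start else 0.0
        dS = -beta * S * I
        dI = beta * S * I - gamma * I - q * I
        dR = gamma * I
        dT = q * I
        S, I, R, T = S + dS, I + dI, R + dR, T + dT
        history.append(I)
    return history

no_q = simulate()
with_q = simulate(q_strength=0.1)
print(max(no_q), max(with_q))  # quarantine lowers the infection peak
```

The learning problem the MIT team tackled is the inverse of this sketch: given observed case curves, infer how strong q(t) must have been in each region.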


OpenAI launches Microscope to visualize the neurons in popular machine learning models

#artificialintelligence

OpenAI today launched Microscope, a library of neuron visualizations, starting with nine popular or heavily studied neural networks. In all, the collection encompasses millions of images. Much as a microscope does in a laboratory, Microscope is meant to help AI researchers better understand the architecture and behavior of neural networks with tens of thousands of neurons. Initial models in Microscope include historically important and commonly studied computer vision models like AlexNet, the 2012 winner of the now-retired ImageNet challenge, which has been cited over 50,000 times in research.
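
Neuron visualizations of this kind are typically produced by optimizing an input image to excite a chosen unit. Here is a toy version of that optimization loop, with a single linear "neuron" standing in for a real network layer; everything in it is an illustrative assumption, not OpenAI's code:

```python
# Gradient ascent on the input: find the pattern that maximally excites
# one "neuron". For a linear unit, activation w @ x has gradient w.
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(size=16)   # the neuron's weights (a stand-in "feature")
x = rng.normal(size=16)   # start from a random input

for _ in range(200):
    x = x + 0.1 * w               # ascend the activation's gradient
    x = x / np.linalg.norm(x)     # keep the input on the unit sphere

# The optimized input aligns with the neuron's preferred direction.
cosine = float(w @ x) / np.linalg.norm(w)
print(round(cosine, 3))  # 1.0
```

In a deep network the gradient comes from backpropagation rather than being the weight vector itself, but the loop is the same; the resulting images are what Microscope catalogs at scale.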


Uber claims its AI enables driverless cars to predict traffic movement with high accuracy

#artificialintelligence

In a paper published on the preprint server Arxiv.org this week, researchers at Uber's Advanced Technologies Group (ATG) propose an AI technique to improve autonomous vehicles' traffic movement predictions. It's directly applicable to the driverless technologies that Uber itself is developing, which must be able to detect, track, and anticipate surrounding cars' trajectories in order to safely navigate public roads. It's well-understood that without the ability to predict the decisions other drivers on the road might make, vehicles can't be fully autonomous. In a tragic case in point, an Uber self-driving prototype hit and killed a pedestrian in Tempe, Arizona two years ago, partly because the vehicle failed to detect and avoid the victim. ATG's research, then -- which is novel in that it employs a generative adversarial network (GAN) to make car trajectory predictions as opposed to less complex architectures -- promises to advance the state of the art by boosting the precision of predictions by an order of magnitude.
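
The GAN setup the paper describes can be sketched at the level of shapes: a generator maps a car's past positions plus a noise vector to a future trajectory, and a discriminator scores whether a (past, future) pair looks realistic. The layer sizes, linear maps, and names below are illustrative assumptions, not ATG's model, which uses learned recurrent and scene-aware networks:

```python
# Shape-level GAN sketch for trajectory prediction: forward passes only,
# with random linear layers standing in for trained networks.
import numpy as np

rng = np.random.default_rng(0)
PAST, FUTURE, NOISE, HIDDEN = 10, 30, 8, 64   # timesteps / dims, illustrative

def layer(n_in, n_out):
    return rng.normal(scale=0.1, size=(n_in, n_out))

G1, G2 = layer(PAST * 2 + NOISE, HIDDEN), layer(HIDDEN, FUTURE * 2)
D1, D2 = layer((PAST + FUTURE) * 2, HIDDEN), layer(HIDDEN, 1)

def generator(past, z):
    h = np.tanh(np.concatenate([past.ravel(), z]) @ G1)
    return (h @ G2).reshape(FUTURE, 2)     # future (x, y) waypoints

def discriminator(past, future):
    h = np.tanh(np.concatenate([past.ravel(), future.ravel()]) @ D1)
    return float(1 / (1 + np.exp(-(h @ D2)[0])))   # P(pair is "real")

past = rng.normal(size=(PAST, 2))
future = generator(past, rng.normal(size=NOISE))
score = discriminator(past, future)
print(future.shape, round(score, 3))
```

Training would alternate between improving the discriminator on real versus generated pairs and improving the generator to fool it; the noise input is what lets the generator produce multiple plausible futures for one past.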


Uber Open-Sources Fiber - A New Library For Distributed Machine Learning

#artificialintelligence

The latest technologies, such as machine learning and deep learning, require colossal amounts of data to improve the accuracy of their outcomes. However, it is nearly impossible for a single local computer to process such vast amounts of data, so practitioners turn to distributed computing for the computational power needed to deliver fast, accurate results. Effectively managing distributed computation is not straightforward, however, and this hinders the training and evaluation of AI models. To address these challenges, Uber has open-sourced its Fiber framework to help researchers and developers streamline large-scale parallel scientific computation.
